
Ada Lovelace


Homo Ratiocinator (Reckoning Human)

Communications of the ACM

Homo Sapiens, "wise human" in Latin, is the taxonomic species name for modern humans. But observing the current state of the world and its trajectory, it is hard for me to accept the description "wise." I am not the first to object to the "sapiens" descriptor. The French philosopher Henri-Louis Bergson argued in 1911 that a better term would be Homo Faber, referring to human tool-making ability. This ability goes back to early humans, about three million years ago. Most importantly, human tools got better and better due to innovation and cultural transmission.


The women putting intelligence in artificial intelligence

#artificialintelligence

Despite the advances made in women's participation in technology education and innovation over the past decade, women remain under-represented in the information technology (IT) sector and in IT-based entrepreneurial initiatives. UNESCO's 2019 report "I'd Blush if I Could" is striking: it found that only 12 percent of artificial intelligence (AI) researchers and just six percent of professional software developers are women. Without diverse perspectives and ideas, we risk developing new technologies that do not meet the needs of half the population. In fact, the European Commission's 2020 white paper on AI calls for "requirements to take reasonable measures aimed at ensuring that [the] use of AI systems does not lead to outcomes entailing prohibited discrimination."


AI Tools Like ChatGPT May Reshape Teaching Materials -- And Possibly Substitute Teachers

#artificialintelligence

This summer, a coding class offered by a private school in Austin, Texas, was led by an unusual teacher. The PreK-8 school, Paragon Prep, offered a series of optional, self-paced, video lessons that were automatically generated from a textbook. In them, an animated avatar made to look like the 19th-century computing pioneer Ada Lovelace taught the basics of the Python programming language. "We'll also look at basic concepts of data analysis, using NumPy as well as Pandas," said the avatar in a female computer voice that sounds more like the iPhone's Siri than like a 19th-century British mathematician, her mouth moving clumsily as she speaks. "If you have no idea what any of that means, that's perfectly fine, good and normal. This course was meant for anyone interested in becoming a future software engineer or data scientist, not someone who is already one."
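The avatar's pitch, Python basics with NumPy and Pandas for data analysis, amounts to a first lesson along these lines (a minimal sketch; the toy dataset and names here are hypothetical, not taken from the course):

```python
import numpy as np
import pandas as pd

# Hypothetical toy dataset in the spirit of a first data-analysis lesson.
scores = pd.DataFrame({
    "student": ["Ada", "Grace", "Alan"],
    "quiz": [88, 92, 79],
})

# NumPy handles the numerical summary...
mean_score = np.mean(scores["quiz"])

# ...while Pandas handles labeled lookups, here the top-scoring student.
top = scores.loc[scores["quiz"].idxmax(), "student"]
```

Concepts of exactly this shape, arrays of numbers on one hand and labeled tables on the other, are the "basic concepts of data analysis" such a course would cover.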


Revolutionizing the World: The Collaboration of Isaac Newton and Ada Lovelace

#artificialintelligence

Once upon a time, in an alternate universe, Isaac Newton and Ada Lovelace were brought together to work on a scientific invention that would change the world forever. Newton, known for his groundbreaking work in physics and mathematics, and Lovelace, known for her contributions to the field of computer science, were both renowned scientists in their own right. Their collaboration began when Newton was approached by a group of investors looking for a way to revolutionize the field of transportation. They wanted to create a machine that could transport people and goods faster and more efficiently than any technology that existed at the time. Newton, who had a deep understanding of the laws of motion and gravity, saw the potential in the project and agreed to work on it.


Up with Data Science, and the First Programmer

Communications of the ACM

Data science is a new interdisciplinary field of research that focuses on extracting value from data, integrating knowledge and methods from computer science, mathematics and statistics, and an application domain. Machine learning is the field at the intersection of computer science and statistics, with many applications in data science when the application domain is considered. From a historical perspective, machine learning has been considered part of artificial intelligence. It was taught mainly in computer science departments to scientists and engineers, and the focus was placed on the mathematical and algorithmic aspects of machine learning, regardless of application domain. Thus, although machine learning also deals with statistics, which focuses on data and considers the application domain, until recently most machine learning activity took place in the context of computer science, where the field began, and traditionally focused on algorithms.


Nvidia GeForce RTX 4090 review: Fantastically, futuristically fast

PCWorld

Nvidia's monstrous GeForce RTX 4090 delivers luxuriously fast frame rates and futuristic features, but DLSS 3's AI speed boost may be the real star. It's a behemoth of a GPU that draws a lot of power, but Nvidia's sublime Founders Edition design remains cool, quiet, and eye-catching. Nvidia's $1,599 GeForce RTX 4090 graphics card sports a luxury price tag, but you get a truly luxurious gaming experience in return. The last-generation RTX 3090 was a speed demon, but the GeForce RTX 4090--thanks to Nvidia's new "Ada Lovelace" architecture--screams through traditional games up to 83 percent faster. Nvidia's 40-series flagship chews through futuristic ray-traced games too, propelled by more advanced 3rd-gen RT cores and a radical new "Frame Generation" DLSS 3 feature that doubles (or more) frame rates yet again with a heaping helping of AI. Creators will love the 24GB of blazing-fast GDDR6X memory when it comes to pixel-packed video renders, and Nvidia's inclusion of not one, but two AV1 video encoders means the RTX 4090 is suitably equipped to handle the future of streaming. No matter what you want to accomplish, the monstrous GeForce RTX 4090 Founders Edition can handle it without breaking a sweat--though your wallet and power supply might, as all this extra performance doesn't come free. This first GPU of a new generation provides our first glimpse at Nvidia's new Ada Lovelace architecture, running on a custom TSMC 4N (read: fancy 5nm) process.


Truly creative A.I. is just around the corner. Here's why that's a big deal

#artificialintelligence

By that same logic, when Hollywood actors start tweeting about a once-obscure part of artificial intelligence (A.I.), you know that something big is happening, too. That's exactly what occurred recently when Zach Braff, the actor-director still best known for his performance as J.D. on the medical comedy series Scrubs, recorded himself reading a Scrubs-style monologue written by an A.I. "What is a hospital?" Braff reads, adopting the thoughtful tone J.D. used to wrap up each episode in the series. "A hospital is a lot like a high school: the most amazing man is dying, and you're the only one who wants to steal stuff from his dad. Being in a hospital is a lot like being in a sorority. You have greasers and surgeons. And even though it sucks about Doctor Tapioca, not even that's sad."



Who Will Design the Future? - Issue 74: Networks

Nautilus

Ada Lovelace was an English mathematician who lived in the first half of the 19th century. In 1842, Lovelace was tasked with translating an article from French into English for Charles Babbage, the "Grandfather of the Computer." The article described Babbage's Analytical Engine, a revolutionary new automatic calculating machine. Although originally retained solely to translate the article, Lovelace also scribbled extensive ideas about the machine into the margins, adding her unique insight: she saw that the Analytical Engine could be used to decode symbols and to make music, art, and graphics. Her notes, which included a method for calculating the Bernoulli number sequence and what would become known as the "Lovelace objection," were the first computer programs on record, even though the machine could not actually be built at the time.1 Though never formally trained as a mathematician, Lovelace was able to see beyond the limitations of Babbage's invention and imagine the power and potential of programmable computers. She did so, moreover, as a woman in an era when women were typically not seen as suited to such a career; Lovelace had to sign her work with just her initials because women weren't regarded as proper authors at the time.2 Still, she persevered,3 and her work, eventually recognized as the world's first computer algorithm, earned her the title of the first computer programmer.
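The calculation at the heart of Lovelace's famous note can be sketched in a few lines of modern code. This uses the standard recurrence for the Bernoulli numbers (with the convention B_1 = -1/2), not her exact tabulation for the Analytical Engine:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0..B_n as exact fractions.

    Uses the standard recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0,
    solved for B_m, under the convention B_1 = -1/2.
    """
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        B[m] = -sum(comb(m + 1, j) * B[j] for j in range(m)) / (m + 1)
    return B
```

The first few values it produces (1, -1/2, 1/6, 0, -1/30, ...) are the same quantities Lovelace's program was designed to generate mechanically, nearly a century before any machine could run it.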